Path: blob/master/Part 8 - Deep Learning/Artificial Neural Networks/[Python] Artificial Neural Network.ipynb
Kernel: Python 3
Artificial Neural Network (ANN)
I prefer using Google Colaboratory: training the model for, say, 100 epochs takes a while on a typical local setup (while writing this code I am actually using my laptop instead of a PC), and Colaboratory is faster.
Data Preprocessing
In [0]:
In [0]:
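The two stripped cells above most likely hold the imports and the dataset load. A minimal sketch, assuming the usual Churn_Modelling.csv bank-churn file (the file name is an assumption; the rows printed further down match that dataset):

import numpy as np
import pandas as pd

# Assumed file name -- a bank-customer churn table whose last column,
# 'Exited', flags whether the customer left the bank.
dataset = pd.read_csv('Churn_Modelling.csv')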
In [104]:
Out[104]:
In [0]:
In [0]:
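The next two cells presumably split the frame into a feature matrix X and a target vector y. The nine-element row printed in Out[107] below (CreditScore, Geography, Age, Tenure, Balance, NumOfProducts, HasCrCard, IsActiveMember, EstimatedSalary) suggests the Gender column was left out; the exact selection here is an assumption:

# Hypothetical reconstruction: drop the identifier columns, the target,
# and (judging from the printed X[0]) the Gender column as well.
X = dataset.drop(columns=['RowNumber', 'CustomerId', 'Surname',
                          'Gender', 'Exited']).values
y = dataset['Exited'].values

The In [107] and In [108] cells then appear to inspect X[0] and y[0].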
In [107]:
Out[107]:
array([619, 'France', 42, 2, 0.0, 1, 1, 1, 101348.88], dtype=object)
In [108]:
Out[108]:
1
In [0]:
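This cell evidently encodes the Geography string: Out[110] below shows 'France' replaced by two leading dummy columns, i.e. the three country dummies with the first one dropped to avoid the dummy-variable trap. A sketch using the older scikit-learn API that was current when this notebook was written (categorical_features has since been removed):

from sklearn.preprocessing import LabelEncoder, OneHotEncoder

# Geography is column 1 of X (see X[0] above); turn the strings into codes.
labelencoder = LabelEncoder()
X[:, 1] = labelencoder.fit_transform(X[:, 1])

# One-hot encode the country codes; the dummies land at the front of X.
onehotencoder = OneHotEncoder(categorical_features=[1])
X = onehotencoder.fit_transform(X).toarray()

# Drop one dummy column to avoid the dummy-variable trap.
X = X[:, 1:]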
In [110]:
Out[110]:
array([[ 0.00000000e+00, 0.00000000e+00, 6.19000000e+02,
4.20000000e+01, 2.00000000e+00, 0.00000000e+00,
1.00000000e+00, 1.00000000e+00, 1.00000000e+00,
1.01348880e+05],
[ 0.00000000e+00, 1.00000000e+00, 6.08000000e+02,
4.10000000e+01, 1.00000000e+00, 8.38078600e+04,
1.00000000e+00, 0.00000000e+00, 1.00000000e+00,
1.12542580e+05]])
In [0]:
In [0]:
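These two cells presumably hold the train/test split and the feature scaling; the standardized rows in Out[113] below are consistent with a StandardScaler fitted on the training set. A sketch (test size and random state are assumptions):

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# An 80/20 split gives 8000 training and 2000 test rows, which matches the
# "8000/8000" progress bars and the 2000-sample confusion matrix later on.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Fit the scaler on the training set only, then reuse it on the test set.
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)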
In [113]:
Out[113]:
array([[-0.5698444 , 1.74309049, 0.16958176, -0.46460796, 0.00666099,
-1.21571749, 0.8095029 , 0.64259497, -1.03227043, 1.10643166],
[ 1.75486502, -0.57369368, -2.30455945, 0.30102557, -1.37744033,
-0.00631193, -0.92159124, 0.64259497, 0.9687384 , -0.74866447]])
Importing the Keras libraries and packages
In [0]:
In [0]:
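These imports are almost certainly the standard standalone-Keras pair (the keras.callbacks.History repr at the end of training confirms this notebook predates tf.keras):

import keras
from keras.models import Sequential  # linear stack of layers
from keras.layers import Dense       # fully connected layer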
Initializing the ANN
In [0]:
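The model starts as an empty Sequential container; 'classifier' is a hypothetical variable name used consistently in the sketches below:

classifier = Sequential()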
In [117]:
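The listing below looks like an inspection of everything the activations module exports, presumably via something like:

from keras import activations
dir(activations)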
Out[117]:
['K',
'Layer',
'__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__',
'absolute_import',
'deserialize',
'deserialize_keras_object',
'elu',
'get',
'hard_sigmoid',
'linear',
'relu',
'selu',
'serialize',
'sigmoid',
'six',
'softmax',
'softplus',
'softsign',
'tanh',
'warnings']
Adding the input layer and the first hidden layer
In [0]:
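In Keras, the first Dense layer doubles as the input layer by declaring input_dim; the encoded feature matrix above has 10 columns, so input_dim=10. The unit count, initializer, and activation below are assumptions (6 units is the common rule of thumb of roughly averaging the 10 inputs and the 1 output; 'relu' is taken from the activations listed above):

classifier.add(Dense(units=6, kernel_initializer='uniform',
                     activation='relu', input_dim=10))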
Adding more hidden layer(s) in between
In [0]:
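Further hidden layers need no input_dim, since Keras infers their input shape from the previous layer. A sketch of one more identical layer:

classifier.add(Dense(units=6, kernel_initializer='uniform',
                     activation='relu'))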
Adding the output layer
In [0]:
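A single sigmoid unit suits the binary churn target: it outputs a probability, which the prediction step later thresholds at 0.5:

classifier.add(Dense(units=1, kernel_initializer='uniform',
                     activation='sigmoid'))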
In [121]:
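Likewise, this listing appears to enumerate the available optimizers:

from keras import optimizers
dir(optimizers)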
Out[121]:
['Adadelta',
'Adagrad',
'Adam',
'Adamax',
'K',
'Nadam',
'Optimizer',
'RMSprop',
'SGD',
'TFOptimizer',
'__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__',
'absolute_import',
'adadelta',
'adagrad',
'adam',
'adamax',
'clip_norm',
'copy',
'deserialize',
'deserialize_keras_object',
'get',
'interfaces',
'nadam',
'rmsprop',
'serialize',
'serialize_keras_object',
'sgd',
'six',
'tf',
'zip']
In [122]:
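And this one the available loss functions:

from keras import losses
dir(losses)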
Out[122]:
['K',
'KLD',
'MAE',
'MAPE',
'MSE',
'MSLE',
'__builtins__',
'__cached__',
'__doc__',
'__file__',
'__loader__',
'__name__',
'__package__',
'__spec__',
'absolute_import',
'binary_crossentropy',
'categorical_crossentropy',
'categorical_hinge',
'cosine',
'cosine_proximity',
'deserialize',
'deserialize_keras_object',
'get',
'hinge',
'kld',
'kullback_leibler_divergence',
'logcosh',
'mae',
'mape',
'mean_absolute_error',
'mean_absolute_percentage_error',
'mean_squared_error',
'mean_squared_logarithmic_error',
'mse',
'msle',
'poisson',
'serialize',
'serialize_keras_object',
'six',
'sparse_categorical_crossentropy',
'squared_hinge']
Compiling the ANN
In [0]:
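Compiling picks one optimizer and one loss from the two listings above. The acc column in the training log below implies metrics=['accuracy']; the optimizer choice is an assumption, though adam is the usual one in this tutorial family:

classifier.compile(optimizer='adam',
                   loss='binary_crossentropy',  # standard loss for a sigmoid output
                   metrics=['accuracy'])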
Fitting the ANN to the Training set
In [124]:
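The fit call that produced the log below ran 100 epochs over the 8000 training rows; the batch size is an assumption (10 is the value this tutorial family typically uses):

classifier.fit(X_train, y_train,
               batch_size=10,  # assumed
               epochs=100)     # matches Epoch 1/100 ... 100/100 below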
Out[124]:
Epoch 1/100
8000/8000 [==============================] - 1s 174us/step - loss: 0.4840 - acc: 0.7960
Epoch 2/100
8000/8000 [==============================] - 1s 137us/step - loss: 0.4323 - acc: 0.7960
Epoch 3/100
8000/8000 [==============================] - 1s 148us/step - loss: 0.4270 - acc: 0.7995
Epoch 4/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4227 - acc: 0.8196
Epoch 5/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4205 - acc: 0.8274
Epoch 6/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4186 - acc: 0.8310
Epoch 7/100
8000/8000 [==============================] - 1s 153us/step - loss: 0.4170 - acc: 0.8317
Epoch 8/100
8000/8000 [==============================] - 1s 146us/step - loss: 0.4154 - acc: 0.8340
Epoch 9/100
8000/8000 [==============================] - 1s 152us/step - loss: 0.4147 - acc: 0.8329
Epoch 10/100
8000/8000 [==============================] - 1s 148us/step - loss: 0.4139 - acc: 0.8327
Epoch 11/100
8000/8000 [==============================] - 1s 149us/step - loss: 0.4130 - acc: 0.8334
Epoch 12/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4124 - acc: 0.8329
Epoch 13/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4118 - acc: 0.8322
Epoch 14/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4109 - acc: 0.8322
Epoch 15/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4106 - acc: 0.8344
Epoch 16/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4098 - acc: 0.8339
Epoch 17/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4097 - acc: 0.8329
Epoch 18/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4098 - acc: 0.8326
Epoch 19/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4094 - acc: 0.8342
Epoch 20/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4091 - acc: 0.8325
Epoch 21/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4090 - acc: 0.8327
Epoch 22/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4085 - acc: 0.8336
Epoch 23/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4086 - acc: 0.8354
Epoch 24/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4084 - acc: 0.8331
Epoch 25/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4082 - acc: 0.8347
Epoch 26/100
8000/8000 [==============================] - 1s 146us/step - loss: 0.4083 - acc: 0.8341
Epoch 27/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4070 - acc: 0.8330
Epoch 28/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4083 - acc: 0.8337
Epoch 29/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4075 - acc: 0.8332
Epoch 30/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4072 - acc: 0.8336
Epoch 31/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4077 - acc: 0.8350
Epoch 32/100
8000/8000 [==============================] - 1s 147us/step - loss: 0.4074 - acc: 0.8332
Epoch 33/100
8000/8000 [==============================] - 1s 150us/step - loss: 0.4072 - acc: 0.8325
Epoch 34/100
8000/8000 [==============================] - 1s 150us/step - loss: 0.4070 - acc: 0.8355
Epoch 35/100
8000/8000 [==============================] - 1s 146us/step - loss: 0.4068 - acc: 0.8327
Epoch 36/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4075 - acc: 0.8342
Epoch 37/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4067 - acc: 0.8335
Epoch 38/100
8000/8000 [==============================] - 1s 138us/step - loss: 0.4069 - acc: 0.8325
Epoch 39/100
8000/8000 [==============================] - 1s 149us/step - loss: 0.4071 - acc: 0.8337
Epoch 40/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4068 - acc: 0.8325
Epoch 41/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4069 - acc: 0.8336
Epoch 42/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4071 - acc: 0.8329
Epoch 43/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4068 - acc: 0.8340
Epoch 44/100
8000/8000 [==============================] - 1s 148us/step - loss: 0.4068 - acc: 0.8335
Epoch 45/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4062 - acc: 0.8347
Epoch 46/100
8000/8000 [==============================] - 1s 139us/step - loss: 0.4066 - acc: 0.8330
Epoch 47/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4063 - acc: 0.8342
Epoch 48/100
8000/8000 [==============================] - 1s 140us/step - loss: 0.4059 - acc: 0.8339
Epoch 49/100
8000/8000 [==============================] - 1s 138us/step - loss: 0.4060 - acc: 0.8337
Epoch 50/100
8000/8000 [==============================] - 1s 140us/step - loss: 0.4062 - acc: 0.8342
Epoch 51/100
8000/8000 [==============================] - 1s 139us/step - loss: 0.4065 - acc: 0.8336
Epoch 52/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4063 - acc: 0.8332
Epoch 53/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4060 - acc: 0.8346
Epoch 54/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4063 - acc: 0.8339
Epoch 55/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4061 - acc: 0.8340
Epoch 56/100
8000/8000 [==============================] - 1s 147us/step - loss: 0.4064 - acc: 0.8342
Epoch 57/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4063 - acc: 0.8342
Epoch 58/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4055 - acc: 0.8345
Epoch 59/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4063 - acc: 0.8331
Epoch 60/100
8000/8000 [==============================] - 1s 145us/step - loss: 0.4062 - acc: 0.8334
Epoch 61/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4065 - acc: 0.8326
Epoch 62/100
8000/8000 [==============================] - 1s 151us/step - loss: 0.4060 - acc: 0.8340
Epoch 63/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4062 - acc: 0.8344
Epoch 64/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4061 - acc: 0.8339
Epoch 65/100
8000/8000 [==============================] - 1s 140us/step - loss: 0.4059 - acc: 0.8357
Epoch 66/100
8000/8000 [==============================] - 1s 138us/step - loss: 0.4060 - acc: 0.8354
Epoch 67/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4058 - acc: 0.8322
Epoch 68/100
8000/8000 [==============================] - 1s 139us/step - loss: 0.4059 - acc: 0.8349
Epoch 69/100
8000/8000 [==============================] - 1s 139us/step - loss: 0.4054 - acc: 0.8351
Epoch 70/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4057 - acc: 0.8340
Epoch 71/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4059 - acc: 0.8341
Epoch 72/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4055 - acc: 0.8347
Epoch 73/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4057 - acc: 0.8336
Epoch 74/100
8000/8000 [==============================] - 1s 140us/step - loss: 0.4050 - acc: 0.8329
Epoch 75/100
8000/8000 [==============================] - 1s 140us/step - loss: 0.4061 - acc: 0.8326
Epoch 76/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4058 - acc: 0.8345
Epoch 77/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4060 - acc: 0.8347
Epoch 78/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4055 - acc: 0.8341
Epoch 79/100
8000/8000 [==============================] - 1s 139us/step - loss: 0.4061 - acc: 0.8336
Epoch 80/100
8000/8000 [==============================] - 1s 138us/step - loss: 0.4055 - acc: 0.8331
Epoch 81/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4057 - acc: 0.8341
Epoch 82/100
8000/8000 [==============================] - 1s 140us/step - loss: 0.4056 - acc: 0.8334
Epoch 83/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4058 - acc: 0.8335
Epoch 84/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4061 - acc: 0.8346
Epoch 85/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4055 - acc: 0.8340
Epoch 86/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4060 - acc: 0.8344
Epoch 87/100
8000/8000 [==============================] - 1s 141us/step - loss: 0.4056 - acc: 0.8357
Epoch 88/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4056 - acc: 0.8339
Epoch 89/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4056 - acc: 0.8326
Epoch 90/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4058 - acc: 0.8347
Epoch 91/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4055 - acc: 0.8342
Epoch 92/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4058 - acc: 0.8349
Epoch 93/100
8000/8000 [==============================] - 1s 144us/step - loss: 0.4057 - acc: 0.8344
Epoch 94/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4055 - acc: 0.8342
Epoch 95/100
8000/8000 [==============================] - 1s 147us/step - loss: 0.4049 - acc: 0.8337
Epoch 96/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4057 - acc: 0.8344
Epoch 97/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4054 - acc: 0.8334
Epoch 98/100
8000/8000 [==============================] - 1s 142us/step - loss: 0.4056 - acc: 0.8332
Epoch 99/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4059 - acc: 0.8345
Epoch 100/100
8000/8000 [==============================] - 1s 143us/step - loss: 0.4056 - acc: 0.8334
<keras.callbacks.History at 0x7f43dd4f69b0>
Predicting the Test set results
In [0]:
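predict returns sigmoid probabilities; the next two cells evidently threshold them at 0.5 (hence the boolean array in Out[126]) and print the first few true labels for comparison (Out[127]). A sketch:

# Probabilities in [0, 1] from the sigmoid output unit.
y_pred = classifier.predict(X_test)

# Threshold at 0.5 to get hard class predictions.
y_pred = (y_pred > 0.5)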
In [126]:
Out[126]:
array([[False],
[False],
[False],
[False],
[False],
[ True],
[False],
[False],
[False],
[ True]], dtype=bool)
In [127]:
Out[127]:
array([0, 1, 0, 0, 0, 1, 0, 0, 1, 1])
Making the Confusion Matrix
In [128]:
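The matrix below was presumably produced with scikit-learn:

from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)  # rows: true class, columns: predicted class
cm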
Out[128]:
array([[1548, 47],
[ 278, 127]])
Calculating Accuracy
In [129]:
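Most likely scikit-learn's accuracy_score. (Note that 0.8545 implies 1709 of 2000 test rows correct, while the matrix above gives 1548 + 127 = 1675, so the two cells were probably evaluated on different runs.)

from sklearn.metrics import accuracy_score
accuracy_score(y_test, y_pred)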
Out[129]:
0.85450000000000004